3 research outputs found

    Using data mining techniques for the prediction of student dropouts from university science programs

    Data mining has taken center stage in education as a tool for addressing student dropout, which has become one of the major threats facing Higher Education Institutions (HEIs). Being able to predict which students are likely to drop out allows a university to assist those facing challenges early. This results in more graduates with the intellectual capital to supply skills to industry, addressing the major skills shortage faced in South Africa. Studies reported in the literature have tackled the dropout problem first through a theoretical approach based on Tinto's model, and later through traditional statistical approaches. However, both lacked accuracy and automation, making them difficult and time-consuming to use, as they must be tested periodically to remain valid. Recently, data mining has become a vital tool for predicting non-linear phenomena, including cases with missing data, while providing both accuracy and automation. Assessments of data mining's usefulness and reliability in education have enabled its use for prediction by different researchers. This research therefore used a data mining approach that integrates classification and prediction techniques to analyze student academic data at the University of Fort Hare and build a model of student dropout from each student's pre-entry data and university academic performance. Following the Knowledge Discovery in Databases (KDD) framework, data for students enrolled in the Bachelor of Science programs between 2003 and 2014 was selected, then preprocessed and transformed to deal with missing and noisy data. Classification algorithms were then used for student characterization. Decision trees (J48), as implemented in the Weka software, were used to build the data mining and prediction model.
    Decision trees were chosen for their ability to handle textual, nominal and numeric data, as was the case with our input data, and for their good precision. The model was trained on a training data set, then validated and evaluated on a separate data set. Experimental results demonstrate that data mining is useful in predicting students at risk of dropping out. A critical analysis of the correctly classified instances, the confusion matrix and the ROC area shows that the model can correctly classify and predict those who are likely to drop out. The model's accuracy was 66%, a good figure as supported in the literature, which means the results can be reliably used for assessment and for making strategic decisions. Furthermore, the model took a matter of seconds to compute results when supplied with 400 instances, which shows that it is both effective and efficient. Grounded in these experimental results, this research demonstrated that data mining brings automation and accuracy to the prediction of student dropouts, and that the results can be reliably depended on for decision making by faculty managers.
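    The evaluation described above relies on counting correctly classified instances and inspecting a confusion matrix on a hold-out set. As an illustrative sketch only (not the thesis's actual Weka/J48 pipeline, and with purely hypothetical labels, where 1 = dropout and 0 = graduate), those two metrics can be computed like this:

    ```python
    def confusion_matrix(actual, predicted):
        """Return (tp, fp, fn, tn) for binary labels where 1 = dropout."""
        tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
        fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
        fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
        tn = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 0)
        return tp, fp, fn, tn

    def accuracy(actual, predicted):
        """Fraction of correctly classified instances."""
        tp, fp, fn, tn = confusion_matrix(actual, predicted)
        return (tp + tn) / len(actual)

    # Toy hold-out set: actual outcomes vs. model predictions (illustrative only).
    actual    = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]
    predicted = [1, 0, 0, 1, 0, 1, 1, 0, 0, 0]

    print(confusion_matrix(actual, predicted))  # (3, 1, 2, 4)
    print(accuracy(actual, predicted))          # 0.7
    ```

    In Weka's output, the accuracy figure corresponds to "Correctly Classified Instances", while the four counts form the 2x2 confusion matrix used in the analysis.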

    Resource Allocation Framework in Fog Computing for the Internet of Things Environments

    Fog computing plays a pivotal role in the Internet of Things (IoT) ecosystem because of its ability to support delay-sensitive tasks, bringing resources from cloud servers closer to the “ground” and supporting IoT devices that are resource-constrained. Although fog computing offers benefits such as quick response to requests, geo-distributed data processing and data processing in the proximity of IoT devices, the exponential increase in IoT devices and the large volumes of data they generate have led to a new set of challenges. One such problem is the allocation of resources to IoT tasks to match their computational needs and quality of service (QoS) requirements while meeting both task deadlines and user expectations. Most solutions in existing work suggest task offloading mechanisms in which IoT devices offload their tasks randomly to the fog or cloud layer. This helps to minimize communication delay; however, many tasks end up missing their deadlines because of the delays experienced during offloading. This study proposes a Resource Allocation Scheduler (RAS) at the IoT-Fog gateway whose goal is to decide where and when a task is to be offloaded, to either the fog layer or the cloud layer, based on its priority, computational needs and QoS requirements. This aim places the work within the communication networks domain, in the transport layer of the Open Systems Interconnection (OSI) model. As such, the study follows the four phases of the top-down approach because of its reusability characteristics. To validate and test the efficiency and effectiveness of the RAS, the fog framework was implemented and evaluated in a simulated smart home setup. The essential metrics used to check whether round-trip time was minimized were queuing time, offloading time and throughput for QoS. The results showed that the RAS helps to reduce round-trip time, increases throughput and leads to improved QoS.
    Furthermore, the approach addressed the starvation problem, a phenomenon that tends to affect low-priority tasks. Most importantly, the results provide evidence that if resource allocation and assignment are done appropriately, round-trip time can be reduced and QoS improved in fog computing. The significant contributions of this research are the novel framework, which minimizes round-trip time, addresses the starvation problem and improves QoS, and a literature review paper regarded by reviewers as the first of its kind on QoS in fog computing.
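    The scheduling idea described above, a gateway that picks fog or cloud per task by deadline and priority while using aging to avoid starving low-priority tasks, can be sketched as follows. This is an illustrative toy, not the thesis's actual RAS: the latency constants, task fields and aging rule are all assumptions.

    ```python
    import heapq

    FOG_LATENCY_MS = 20      # assumed one-way latency to the fog layer
    CLOUD_LATENCY_MS = 120   # assumed one-way latency to the cloud layer

    def choose_layer(task):
        """Offload deadline-tight tasks to the fog, the rest to the cloud."""
        return "fog" if task["deadline_ms"] <= 2 * CLOUD_LATENCY_MS else "cloud"

    def schedule(tasks, aging_boost=1):
        """Dispatch tasks in priority order, boosting waiting tasks (aging)."""
        # Max-heap via negated priority; index i breaks ties deterministically.
        heap = [(-t["priority"], i, t) for i, t in enumerate(tasks)]
        heapq.heapify(heap)
        order = []
        while heap:
            _, _, task = heapq.heappop(heap)
            order.append((task["name"], choose_layer(task)))
            # Aging: every task still waiting gains priority, so low-priority
            # tasks eventually reach the head of the queue and never starve.
            heap = [(p - aging_boost, i, t) for p, i, t in heap]
            heapq.heapify(heap)
        return order

    # Hypothetical smart-home tasks.
    tasks = [
        {"name": "smoke-alarm", "priority": 9, "deadline_ms": 50},
        {"name": "video-backup", "priority": 2, "deadline_ms": 5000},
        {"name": "thermostat", "priority": 5, "deadline_ms": 200},
    ]
    print(schedule(tasks))
    # [('smoke-alarm', 'fog'), ('thermostat', 'fog'), ('video-backup', 'cloud')]
    ```

    The aging step is one simple way to realize the starvation fix the abstract claims: each dispatch raises the effective priority of everything still queued, bounding how long a low-priority task can wait.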
